Automatic Variational Inference in Stan
Abstract
Variational inference is a scalable technique for approximate Bayesian inference. Deriving variational inference algorithms requires tedious model-specific calculations; this makes it difficult for non-experts to use. We propose an automatic variational inference algorithm, automatic differentiation variational inference (ADVI); we implement it in Stan (code available), a probabilistic programming system. In ADVI, the user provides a Bayesian model and a dataset, nothing else. We make no conjugacy assumptions and support a broad class of models. The algorithm automatically determines an appropriate variational family and optimizes the variational objective. We compare ADVI to MCMC sampling across hierarchical generalized linear models, nonconjugate matrix factorization, and a mixture model. We train the mixture model on a quarter million images. With ADVI we can use variational inference on any model we write in Stan.
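As a rough illustration of the "model plus data, nothing else" workflow described above (a minimal sketch, not taken from the paper: the regression model, file names, and data are invented, and the call shown uses CmdStanPy's variational() method, one of several interfaces to Stan's ADVI):

```python
# Minimal sketch: fit a Bayesian linear regression with ADVI via CmdStanPy.
# The model, file names, and data below are illustrative assumptions.
import numpy as np
from cmdstanpy import CmdStanModel

stan_code = """
data {
  int<lower=0> N;
  vector[N] x;
  vector[N] y;
}
parameters {
  real alpha;
  real beta;
  real<lower=0> sigma;
}
model {
  alpha ~ normal(0, 10);
  beta ~ normal(0, 10);
  sigma ~ cauchy(0, 2.5);
  y ~ normal(alpha + beta * x, sigma);
}
"""
with open("linreg.stan", "w") as f:
    f.write(stan_code)

x = np.random.randn(100)
y = 1.5 + 2.0 * x + 0.5 * np.random.randn(100)
data = {"N": 100, "x": x.tolist(), "y": y.tolist()}

model = CmdStanModel(stan_file="linreg.stan")
# ADVI: no model-specific derivations by the user; the variational family
# ("meanfield" or "fullrank") and the objective are handled automatically.
fit = model.variational(data=data, algorithm="meanfield", seed=1)
print(fit.variational_params_dict)
```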
Similar references
Automatic Differentiation Variational Inference
Modern data analysis requires an iterative cycle: in the probabilistic modeling framework, a simple model is fit to the data, and it is refined as we gather more knowledge about the data’s hidden structure. However, fitting complex models to large datasets is mathematically and computationally challenging. We develop an automated tool called automatic differentiation variational inference...
Fast Black-box Variational Inference through Stochastic Trust-Region Optimization
We introduce TrustVI, a fast second-order algorithm for black-box variational inference based on trust-region optimization and the “reparameterization trick.” At each iteration, TrustVI proposes and assesses a step based on minibatches of draws from the variational distribution. The algorithm provably converges to a stationary point. We implement TrustVI in the Stan framework and compare it to ...
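The "reparameterization trick" mentioned above is the same device ADVI relies on: draws from the variational distribution are written as a deterministic transformation of parameter-free noise, so gradients of a Monte Carlo ELBO estimate can flow through the sampling step. Below is a toy sketch of that estimator for a one-dimensional Gaussian approximation, optimized with plain gradient ascent; it is not the TrustVI trust-region procedure, and the target density, step size, and finite-difference gradient are illustrative assumptions.

```python
# Sketch of the reparameterization trick used by gradient-based VI methods
# such as ADVI and TrustVI (illustrative only; not the TrustVI algorithm).
# A Gaussian approximation q(z) = N(mu, exp(log_sigma)^2) is written as
# z = mu + exp(log_sigma) * eps with eps ~ N(0, 1), so a Monte Carlo ELBO
# gradient can be estimated through the sampling step.
import numpy as np

def log_joint(z):
    # Hypothetical target: unnormalized log p(z, data) for a toy model.
    return -0.5 * (z - 2.0) ** 2

def elbo_and_grad(mu, log_sigma, n_draws=64, rng=np.random.default_rng(0)):
    eps = rng.standard_normal(n_draws)          # minibatch of base draws
    sigma = np.exp(log_sigma)
    z = mu + sigma * eps                        # reparameterized draws from q
    # Finite-difference stand-in for d(log_joint)/dz; a real implementation
    # would use automatic differentiation, as Stan does.
    h = 1e-5
    dlogp_dz = (log_joint(z + h) - log_joint(z - h)) / (2 * h)
    # Gaussian entropy: 0.5 * log(2*pi*e) + log_sigma.
    elbo = np.mean(log_joint(z)) + 0.5 * np.log(2 * np.pi * np.e) + log_sigma
    grad_mu = np.mean(dlogp_dz)                              # dz/dmu = 1
    grad_log_sigma = np.mean(dlogp_dz * sigma * eps) + 1.0   # dz/dlog_sigma = sigma*eps; +1 from entropy
    return elbo, grad_mu, grad_log_sigma

mu, log_sigma = 0.0, 0.0
for step in range(200):                         # plain gradient ascent on the ELBO
    _, g_mu, g_ls = elbo_and_grad(mu, log_sigma)
    mu += 0.1 * g_mu
    log_sigma += 0.1 * g_ls
print(mu, np.exp(log_sigma))                    # should approach mean 2.0, sd 1.0
```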
Deep Probabilistic Programming
We propose Edward, a Turing-complete probabilistic programming language. Edward builds on two compositional representations—random variables and inference. By treating inference as a first class citizen, on a par with modeling, we show that probabilistic programming can be as flexible and computationally efficient as traditional deep learning. For flexibility, Edward makes it easy to fit the sa...
Stan: A Probabilistic Programming Language
Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.2.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian...
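As a sketch of what "imperatively defines a log probability function" means in practice (hypothetical model and data; the target += statements explicitly accumulate log-density terms, and the sample() call runs the NUTS sampler through CmdStanPy):

```python
# Sketch: a Stan program builds up the log density imperatively via
# `target +=` statements; sampling statements (~) are shorthand for the
# same increments. File name and data are illustrative assumptions.
import numpy as np
from cmdstanpy import CmdStanModel

stan_code = """
data {
  int<lower=0> N;
  vector[N] y;
}
parameters {
  real mu;
  real<lower=0> sigma;
}
model {
  // Each statement adds a term to the log probability accumulator `target`.
  target += normal_lpdf(mu | 0, 5);        // prior on mu
  target += cauchy_lpdf(sigma | 0, 2.5);   // prior on sigma
  target += normal_lpdf(y | mu, sigma);    // likelihood (vectorized)
}
"""
with open("normal.stan", "w") as f:
    f.write(stan_code)

data = {"N": 50, "y": np.random.normal(1.0, 2.0, size=50).tolist()}
model = CmdStanModel(stan_file="normal.stan")
# Full Bayesian inference with the No-U-Turn sampler (NUTS), Stan's default.
fit = model.sample(data=data, chains=4, iter_sampling=1000, seed=1)
print(fit.summary())
```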
Automatic Variational ABC
Approximate Bayesian Computation (ABC) is a framework for performing likelihood-free posterior inference for simulation models. Stochastic Variational Inference (SVI) is an appealing alternative to the inefficient sampling approaches commonly used in ABC. However, SVI is highly sensitive to the variance of the gradient estimators, and this problem is exacerbated by approximating the likelihood....
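For readers unfamiliar with the likelihood-free setting this entry refers to, here is a toy rejection-ABC sketch (illustrative only: the simulator, prior, summary statistic, and tolerance are invented, and this is the basic rejection scheme rather than the variational method the paper proposes):

```python
# Toy sketch of likelihood-free (rejection) ABC: posterior inference using
# only a simulator, no likelihood evaluations. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=100):
    # Forward model we can sample from but whose likelihood we treat as unavailable.
    return rng.normal(theta, 1.0, size=n)

observed = simulator(2.0)                       # "real" data with true theta = 2
obs_stat = observed.mean()                      # summary statistic

accepted = []
for _ in range(20000):
    theta = rng.normal(0.0, 5.0)                # draw a candidate from the prior
    sim_stat = simulator(theta).mean()          # simulate and summarize
    if abs(sim_stat - obs_stat) < 0.1:          # keep draws whose simulations match the data
        accepted.append(theta)

print(len(accepted), np.mean(accepted))         # approximate posterior mean, near 2
```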
Publication date: 2015